The Data Architect will be responsible for providing customers with guidance, maintenance, and deployment support across the structure, integration, monitoring, storage, and governance of data.
What Sets You Apart:
- You will have experience with creating data pipelines and flows to and between business applications and data repositories.
- You will have a technical and business understanding of industry-leading deployments of Data Lakes, Data Warehouses, and Data Marts, including the progression of data from raw/bronze to curated/gold environments, with an interest in tools and technologies such as Python, Matillion, Snowflake, Databricks, and Azure.
- Expertise in high-transaction-volume deployments is required. An understanding of data modeling, tabular object model or semantic layer development, and optimization of Microsoft Power BI is beneficial.
- You will also support the deployment of our business applications and related migrations/integrations for Finance customers as well as the Enterprise market.
- You will contribute to the development of unique repeatable offerings as they relate to both horizontal and vertical industry customers.
The candidate must be self-directed and comfortable supporting the data needs of multiple customer types, teams, systems, and products. Excellent written and verbal skills are required, including complete comfort presenting and defending assessments, designs, approaches, and technology choices. Natural curiosity and a technology focus are keys to success, along with a desire to explore, understand, and deploy AI/ML models.
As Data Architect you will:
- Participate in the design, development, and implementation of architectural deliverables, including assessment and optimization of system design and review of user requirements.
- Provide technical knowledge and capabilities as a team member and individual contributor.
- Maintain current and future state architecture.
- Design, create, deploy, and manage how data will be stored, consumed, integrated, and managed by different data entities and IT systems, as well as by any applications that use or process that data.
- Assist in designing new architectures and re-designing existing architectures, including data structures and database code. Design and build relational databases.
- Perform data access analysis and design, as well as archive/recovery design and implementation, especially in a Microsoft SQL Server environment. Define, design, and build dimensional databases.
- Design and implement all database objects including schemas, tables, clusters, indexes, views, sequences, packages and procedures based on system requirements.
- Contribute to data warehousing blueprints, evaluate hardware and software platforms, and integrate systems. Develop strategies for data acquisition, archive/recovery, and database implementation. Translate business needs into long-term architecture solutions.
- Manage input from various data sources, define database architecture, develop database code, construct and maintain ETL-based processes, design database reporting, build data flow diagrams, and create documentation manuals.
- Create code for database modification; construct and access databases using stored procedures, triggers, and functions; and assist in developing ETL processes.
- Participate in multiple projects with competing deadlines.
- Influence and direct activities of a team related to special initiatives or operations.
Job Specifications
- Proficiency with multiple data engineering tools and technologies, such as Python, Matillion, Databricks, SQL, and cloud-based pipelines.
- 4+ years of experience in data analytics disciplines, including Finance and Accounting.
- 4+ years of direct experience in Big Data, Data Warehousing, Data Analytics, and/or Information Management projects; Snowflake preferred.
- 2+ years of direct experience in cloud data solution architectures, design and development including ETL, data warehousing, data lakes, and big data.
- 2+ years of experience using SQL, including development of stored procedures, functions, triggers, and views.
- 2+ years of Matillion, Databricks, Python and Snowflake experience.
- Experience in structuring solutions for high-scale data environments that support high transaction volume and scalable BI-driven data architecture.
- Knowledge of Data Governance principles.
- Complete comfort and fluency in articulating the ROI of corporate data initiatives.
Education
- Bachelor’s degree in Computer Science, Computer Engineering, or Software Engineering.
Location
- Applicants must be local to the Richmond, VA area. The team meets on-site occasionally; there is no weekly on-site requirement.